Evaluating code-based test input generator tools

Authors

  • Lajos Cseppentő
  • Zoltán Micskei

Abstract

In recent years, several tools have been developed to automatically select test inputs from the code of the system under test. However, each of these tools has different advantages, and there is little detailed feedback available on the actual capabilities of the various tools. In order to evaluate test input generators, this paper collects a set of programming language concepts that the tools should handle, and maps these core concepts and challenging features, such as handling the environment or multi-threading, to 363 code snippets. These snippets serve as inputs for the tools. Next, the paper presents SETTE, an automated framework to execute and evaluate these snippets. Using SETTE, multiple experiments were performed on five Java-based tools and one .NET-based tool employing symbolic execution, search-based, and random techniques. The test suites' coverage, size, generation time, and mutation score were compared. The results highlight the strengths and weaknesses of each tool and approach, and identify code constructs that are difficult for most of the tools to handle. We hope that this research can serve as actionable feedback to tool developers and help practitioners assess the readiness of test input generation. Copyright © 2016 John Wiley & Sons, Ltd.
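The SETTE snippet corpus itself is not reproduced here; the following hypothetical Java snippet (names and values are purely illustrative, not taken from the benchmark) merely sketches the kind of input such tools are evaluated on. Covering every branch requires solving a non-linear arithmetic constraint and hitting an exact string value, features that are known to challenge some symbolic execution engines and that purely random generators rarely satisfy.

    /**
     * Hypothetical example of a code snippet given to a test input generator.
     * Full branch coverage requires specific, hard-to-guess inputs.
     */
    public final class GuardedBranches {

        private GuardedBranches() {
        }

        /** Needs x * y == 12 and x > y (e.g. x = 4, y = 3) to reach every return. */
        public static int nonLinear(int x, int y) {
            if (x * y == 12) {
                if (x > y) {
                    return 2;
                }
                return 1;
            }
            return 0;
        }

        /** Exact string equality: random input rarely produces the "magic" value. */
        public static boolean magicWord(String input) {
            return "sette".equals(input);
        }
    }

A generator is effective on such a snippet if its generated test suite reaches every return statement, an outcome reflected in the coverage and mutation score measurements that SETTE compares.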


Related articles

An automatic test case generator for evaluating implementation of access control policies

One of the main requirements for providing software security is the enforcement of access control policies, which aim to protect the resources of the system against unauthorized access. Any error in the implementation of such policies may lead to undesirable outcomes. For testing the implementation of access control policies, it is preferable to use automated methods, which are faster and more relia...


Automated Input Generator for Android Applications

Android applications have been tested without any knowledge of their internals using a variety of tools such as Monkey [2] and MonkeyRunner [3]. In this paper, we evaluate existing Android testing techniques by comparing these tools and assessing their efficiency based on a number of factors. Next, we propose requirements for an ideal input generator and present an automated input generator usi...


VHDL Code Generator for a Complex Multiplier

In this paper we present a VHDL code generator for a complex multiplier. The complex multiplier is based on a bit-parallel version of distributed arithmetic which reduces the hardware by nearly half compared to a straightforward implementation based on real multipliers. We choose an Overturned-Stairs adder tree to perform the summations in distributed arithmetic. The tree has a regular structur...
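For context, the operation being implemented is the ordinary complex product (a + bi)(c + di) = (ac − bd) + (ad + bc)i. The "straightforward implementation based on real multipliers" mentioned above corresponds to the four multiplications and two additions/subtractions in the plain-Java sketch below (illustrative only; the distributed-arithmetic formulation replaces these multipliers with look-up-table-based accumulation in hardware and is not shown here).

    /** Illustrative baseline: the straightforward complex product with four real multiplications. */
    public final class ComplexProduct {

        private ComplexProduct() {
        }

        /** Returns {re, im} of (a + b*i) * (c + d*i). */
        public static double[] multiply(double a, double b, double c, double d) {
            double re = a * c - b * d; // real part: ac - bd
            double im = a * d + b * c; // imaginary part: ad + bc
            return new double[] {re, im};
        }
    }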


Code Generator Testing in Practice

This paper provides an overview of a practice-oriented testing approach for code generation tools. The main application area for the testing approach presented here is the testing of optimisations performed by the code generator. The test models and corresponding test vectors generated represent an important component in a comprehensive test suite for code generators.


TUnit - Unit Testing For Template-based Code Generators

Template-based code generator development as part of model-driven development (MDD) demands strong mechanisms and tools that help developers improve robustness, i.e., ensure that the desired code is generated for the specified inputs. Although different testing methods have been proposed, a method for testing only parts of template-based code generators that can be employed in the early stage of ...
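To illustrate the idea of testing an individual template in isolation, the sketch below uses a hand-written stand-in template and a JUnit test; it is a minimal, hypothetical example under assumed names and does not use TUnit's actual API, which is not described in this excerpt.

    import static org.junit.jupiter.api.Assertions.assertEquals;

    import org.junit.jupiter.api.Test;

    /** Minimal sketch: unit-testing one template of a code generator in isolation. */
    class GetterTemplateTest {

        /** Stand-in for a single template that renders a getter for a model field. */
        static String renderGetter(String type, String field) {
            String name = Character.toUpperCase(field.charAt(0)) + field.substring(1);
            return "public " + type + " get" + name + "() { return " + field + "; }";
        }

        @Test
        void generatesGetterForStringField() {
            assertEquals("public String getName() { return name; }",
                    renderGetter("String", "name"));
        }
    }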



Journal:
  • Softw. Test., Verif. Reliab.

Volume 27, Issue –

Pages –

Publication year: 2017